Minimax Entropy Principle and Its Application to Texture Modeling
Authors
Abstract
This article proposes a general theory and methodology, called the minimax entropy principle, for building statistical models for images (or signals) in a variety of applications. This principle consists of two parts. The first is the maximum entropy principle for feature binding (or fusion): for a given set of observed feature statistics, a distribution can be built to bind these feature statistics together by maximizing the entropy over all distributions that reproduce these feature statistics. The second part is the minimum entropy principle for feature selection: among all plausible sets of feature statistics, we choose the set whose maximum entropy distribution has the minimum entropy. Computational and inferential issues in both parts are addressed; in particular, a feature pursuit procedure is proposed for approximately selecting the optimal set of features. The minimax entropy principle is then corrected by considering the sample variation in the observed feature statistics, and an information criterion for feature pursuit is derived. The minimax entropy principle is applied to texture modeling, where a novel Markov random field (MRF) model, called FRAME (Filter, Random field, And Minimax Entropy), is derived, and encouraging results are obtained in experiments on a variety of texture images. The relationship between our theory and the mechanisms of neural computation is also discussed.
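The maximum entropy step (feature binding) described in the abstract can be sketched on a toy one-dimensional domain. Everything below is an illustrative assumption, not the paper's construction: the states, the two moment features, and the small sample stand in for the filter-response histograms that FRAME uses on images. The sketch fits the Gibbs-form maximum entropy distribution p(x) proportional to exp(sum_a lambda_a f_a(x)) by gradient ascent, whose gradient is the gap between observed and model feature statistics.

```python
import numpy as np

# Toy domain: 8 signal values scaled into [0, 1] (an illustrative stand-in
# for image intensities; not from the paper).
states = np.arange(8.0) / 7.0
features = np.stack([states, states ** 2])   # two feature statistics (moments)

# "Observed" feature statistics, computed from a made-up sample.
sample = np.array([1.0, 2.0, 2.0, 3.0, 5.0]) / 7.0
observed = np.array([sample.mean(), (sample ** 2).mean()])

# The maximum entropy distribution matching these statistics has Gibbs form
# p(x) ~ exp(lam . f(x)).  Fit the Lagrange multipliers lam by gradient
# ascent on the log-likelihood: gradient = observed - model expectations.
lam = np.zeros(2)
for _ in range(100_000):
    logits = lam @ features
    p = np.exp(logits - logits.max())        # unnormalized Gibbs weights
    p /= p.sum()                             # current model distribution
    model = features @ p                     # model expectations E_p[f(x)]
    lam += 0.5 * (observed - model)

# At convergence the model distribution reproduces the observed statistics,
# which is exactly the feature-binding property of the maximum entropy step.
fitted = features @ p
```

The minimum entropy half of the principle would then compare candidate feature sets by the entropy of their fitted maximum entropy distributions, preferring the set whose fitted model is most constrained.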
Similar resources
Learning Inhomogeneous Gibbs Model of Faces by Minimax Entropy
In this paper we propose a novel inhomogeneous Gibbs model by the minimax entropy principle, and apply it to face modeling. The maximum entropy principle generalizes the statistical properties of the observed samples and results in the Gibbs distribution, while the minimum entropy principle makes the learnt distribution close to the observed one. To capture the fine details of a face, an inhomo...
Regularized Minimax Conditional Entropy for Crowdsourcing
There is a rapidly increasing interest in crowdsourcing for data labeling. By crowdsourcing, a large number of labels can often be gathered quickly and at low cost. However, the labels provided by crowdsourcing workers are usually not of high quality. In this paper, we propose a minimax conditional entropy principle to infer ground truth from noisy crowdsourced labels. Under this principle, we ...
Learning from the Wisdom of Crowds by Minimax Entropy
An important way to make large training sets is to gather noisy labels from crowds of nonexperts. We propose a minimax entropy principle to improve the quality of these labels. Our method assumes that labels are generated by a probability distribution over workers, items, and labels. By maximizing the entropy of this distribution, the method naturally infers item confusability and worker expert...
MiniMax Entropy and Maximum Likelihood: Complementarity of Tasks, Identity of Solutions
The concept of the exponential family is generalized by simple and general exponential forms. Simple and general potentials are introduced. Maximum Entropy and Maximum Likelihood tasks are defined. The ML task on the simple exponential form and the ME task on the simple potentials are proved to be complementary in setup and identical in solutions. The ML task on the general exponential form and the ME task on the genera...
Journal:
Volume/Issue:
Pages: -
Published: 1997